Merged
added 20 commits
April 22, 2026 00:55
Adds the internal/railway package with a GraphQL client against Railway's Backboard v2 endpoint, strongly-typed models covering the full Workspace->Project->Environment->Service->Deployment hierarchy, and a per-workspace conversation history store with atomic persistence.
Adds RailwayCredentials struct, ProviderRailway constant, and StoreRailwayCredentials / GetRailwayCredentials on the backend client so the credentials subcommand can round-trip Railway account tokens through the clanker backend like the other providers.
…mmand storeRailwayCredentials resolves the account token from flags, config, or RAILWAY_API_TOKEN and persists via the backend client. testRailwayCredentials verifies the token by POSTing a minimal GraphQL me query to backboard.railway.com/graphql/v2 and decoding the response envelope (honouring Railway's 200-with-errors shape).
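Decoding that 200-with-errors envelope could look roughly like the sketch below. The envelope shape (`data`/`errors`) follows the GraphQL spec; the `decodeMe` helper and the exact `me` payload fields are assumptions for illustration, not the real `testRailwayCredentials` code.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// graphQLEnvelope models the response shape: HTTP 200 even on auth
// failure, with failures carried in the errors array of the body.
type graphQLEnvelope struct {
	Data struct {
		Me struct {
			Name string `json:"name"`
		} `json:"me"`
	} `json:"data"`
	Errors []struct {
		Message string `json:"message"`
	} `json:"errors"`
}

// decodeMe returns the authenticated user's name, treating any entry
// in errors as a failure regardless of the HTTP status code.
func decodeMe(body []byte) (string, error) {
	var env graphQLEnvelope
	if err := json.Unmarshal(body, &env); err != nil {
		return "", err
	}
	if len(env.Errors) > 0 {
		return "", fmt.Errorf("graphql error: %s", env.Errors[0].Message)
	}
	return env.Data.Me.Name, nil
}

func main() {
	name, _ := decodeMe([]byte(`{"data":{"me":{"name":"Ada"}}}`))
	fmt.Println(name) // Ada

	_, err := decodeMe([]byte(`{"data":null,"errors":[{"message":"Not Authorized"}]}`))
	fmt.Println(err != nil) // true
}
```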
Adds Railway to ServiceContext, registers railway-specific keywords (railway, railway.app, railway project/service/deployment/volume/ environment/plugin, nixpacks, railway.json/.toml), triggers LLM classification when Railway is inferred, and clears other cloud flags when classification lands on railway.
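The keyword-inference step could be sketched as below. The keyword subset comes from the list above; the `inferRailway` name and substring-matching approach are assumptions, since the real `routing.go` classifier may tokenize differently.

```go
package main

import (
	"fmt"
	"strings"
)

// railwayKeywords is an illustrative subset of the registered terms;
// the real list in internal/routing/routing.go is longer.
var railwayKeywords = []string{
	"railway", "railway.app", "nixpacks", "railway.json", "railway.toml",
}

// inferRailway lowercases the query and reports whether any registered
// keyword appears, mirroring the inference that runs before the more
// expensive LLM classification step.
func inferRailway(query string) bool {
	q := strings.ToLower(query)
	for _, kw := range railwayKeywords {
		if strings.Contains(q, kw) {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(inferRailway("deploy this with Nixpacks on Railway")) // true
	fmt.Println(inferRailway("what is my AWS bill"))                  // false
}
```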
Adds --railway flag and the handleRailwayQuery path (mirrors Vercel): resolve token -> create client -> fetch context -> load per-workspace conversation history -> build system prompt -> dispatch to configured AI provider -> persist exchange. Registers clanker_railway_ask / clanker_railway_list MCP tools covering projects, services, deployments, domains, variables, volumes, and workspaces. Registers the railway cobra tree and the phase-1 railway ask stub so the subcommand is discoverable alongside vercel.
The --railway flag was read via cmd.Flags().GetBool but never registered with Flags().Bool, so `clanker ask --railway ...` surfaced as an unknown flag instead of routing to handleRailwayQuery. Also threads Railway through the credentials-subcommand long descriptions so help output lists it alongside the other providers.
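The failure mode can be reproduced with the standard library's `flag` package, which behaves analogously to cobra here: reading a flag that was never registered surfaces as an unknown-flag parse error. The `parseRailway` helper is a hypothetical stand-in for the cobra wiring, not the actual `cmd/ask.go` code.

```go
package main

import (
	"flag"
	"fmt"
	"io"
)

// parseRailway parses args with or without registering --railway,
// mirroring the bug (unregistered) and the fix (registered).
func parseRailway(args []string, register bool) (bool, error) {
	fs := flag.NewFlagSet("ask", flag.ContinueOnError)
	fs.SetOutput(io.Discard) // suppress usage output in this demo
	var railway bool
	if register {
		// The fix: register the flag before anything tries to read it.
		fs.BoolVar(&railway, "railway", false, "route the query to Railway")
	}
	if err := fs.Parse(args); err != nil {
		return false, err
	}
	return railway, nil
}

func main() {
	_, err := parseRailway([]string{"--railway"}, false)
	fmt.Println(err != nil) // true: unknown flag, the pre-fix behaviour

	on, _ := parseRailway([]string{"--railway"}, true)
	fmt.Println(on) // true: flag registered and parsed
}
```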
…tion) Adds the --railway branch to --maker mode explicit provider routing, keyword-based provider inference, LLM classification prompt, plan prompt dispatcher (RailwayPlanPromptWithMode), and plan execution dispatcher (ExecuteRailwayPlan). Without these glue points the maker pipeline could not emit or apply Railway plans even though exec_railway.go and railway_prompts.go were already in place.
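The command validation that `exec_railway.go` applies before executing a plan (first arg must be `railway`, shell metacharacter rejection, destructive verb guarding, unresolved placeholder detection) could look roughly like this sketch. The function name and the destructive-verb set are assumptions; the real checks may differ in detail.

```go
package main

import (
	"fmt"
	"strings"
)

// validatePlanCommand checks one tokenized plan command before execution.
func validatePlanCommand(argv []string) error {
	if len(argv) == 0 || argv[0] != "railway" {
		return fmt.Errorf("first argument must be 'railway'")
	}
	// Hypothetical subset of verbs treated as destructive.
	destructive := map[string]bool{"down": true, "delete": true, "unlink": true}
	for i, arg := range argv {
		// Unresolved <placeholder> tokens left over from plan generation.
		if strings.Contains(arg, "<") && strings.Contains(arg, ">") {
			return fmt.Errorf("unresolved placeholder in %q", arg)
		}
		// Reject shell metacharacters outright; plans run without a shell.
		if strings.ContainsAny(arg, ";|&$`") {
			return fmt.Errorf("shell metacharacter in %q", arg)
		}
		if i == 1 && destructive[arg] {
			return fmt.Errorf("destructive verb %q requires explicit confirmation", arg)
		}
	}
	return nil
}

func main() {
	fmt.Println(validatePlanCommand([]string{"railway", "up"}))                       // <nil>
	fmt.Println(validatePlanCommand([]string{"railway", "up", "&&", "curl"}) != nil)  // true
	fmt.Println(validatePlanCommand([]string{"railway", "link", "<project-id>"}) != nil) // true
}
```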
Summary
Full Railway provider integration for the CLI mirroring the Vercel implementation.
- `internal/railway` package: GraphQL client (curl shell-out to `backboard.railway.com/graphql/v2`), types, static cobra command tree (list, get, logs, analytics, deploy, redeploy, cancel, variable, domain, environment), conversation history with atomic writes.
- `internal/maker/exec_railway.go` + `railway_prompts.go`: plan execution with command validation (first arg must be `railway`, shell metacharacter rejection, destructive verb guarding, unresolved placeholder detection). LLM prompt examples covering `railway up`, `redeploy`, `cancel`, `variable set/rm`, `domain add/rm`, `environment new/rm`, `link`.
- `cmd/ask.go`: `--railway` flag + `handleRailwayQuery` with per-workspace conversation history and context gathering; mirrors `handleVercelQuery`.
- `cmd/credentials.go`: `storeRailwayCredentials` / `testRailwayCredentials` via a GraphQL `me` query, with backend round-trip.
- `cmd/mcp.go`: `clanker_railway_ask` and `clanker_railway_list` tools (projects, services, deployments, domains, variables, volumes, workspaces).
- `internal/routing/routing.go`: `railwayKeywords`, `ServiceContext.Railway` field, classifier + LLM-disambiguation branch.
- `cmd/railway.go`: phase-1 ask stub subcommand.
- `RailwayCredentials` reads `RAILWAY_API_TOKEN` (account/workspace scope), not `RAILWAY_TOKEN`, which is project-scoped.
Test plan
- `clanker credentials store railway --api-token $TOKEN` stores successfully
- `clanker credentials test railway` reports `PASSED: authenticated as <name>`
- `clanker railway list projects` prints the project list
- `clanker railway list services --project <id>` prints services
- `clanker railway logs <deployment-id>` streams log entries
- `clanker ask --railway "what projects do we have?"` returns an AI summary from Railway context
- `clanker ask --railway "..."` sees prior conversation
- `clanker ask "deploy this to railway"` in maker mode generates a `railway up` plan, validates, and executes
- `clanker_railway_list` with `resource=projects` returns workspace projects

Paired with desktop PR: https://github.com/clankercloud/clanker-cloud/pull/371